
Professor Joseph Vukov, center, conducts a course on virtual reality from both a technological and philosophical perspective. (Photo: Lukas Keapproth)
Bridging ethics and technology: Loyola's pioneering approach to humanizing AI and VR education
Professors take a cross-disciplinary approach as they teach students to think critically about the use of these emerging tools
Virtual reality (VR) and artificial intelligence (AI) are changing how we learn, work, and interact. As these technological possibilities rapidly expand, so do existential questions. How do we envision the future of life on Earth? And how do we prepare students to meet inevitable challenges?
Joseph Vukov, an associate professor of philosophy, and Michael Burns, an associate professor of biology, built the course “Philosophy and Biology for the Future” to explore these ever-evolving issues with the goal of creating a more just world. They designed the course to help students understand and envision the future through the intersecting lenses of the biological sciences, ethics, and faith.
The course's site poses a challenging thesis: “If you haven't noticed, humanity has some problems. Big problems. Climate change, genetic engineering, the rise of Big Tech, technological turmoil in health care, the advent of artificial intelligence. And that's just the start. To tackle these problems, we're going to need the best minds, equipped with the latest resources provided by scientific, philosophical, and religious perspectives. And we're going to have to learn how to talk with others: even (and especially) those with whom we disagree.”
To put it more generally, says Burns, “We’re both educators who saw the need to teach our students about near-future issues where we will have to make decisions about what our society and community will look like.”
Using innovation to engage locally and virtually
In 2022, the interdisciplinary duo began to focus on VR, giving students hands-on experiences with artificial reality, virtual reality, and the metaverse.
“We see the value of learning by doing. We got the students plugged in with headsets and met for class in the metaverse that day,” says Vukov. “This space allowed students to discuss the technology as they used it. How does the headset work? Is it accessible or comfortable to use? What kind of psychological headspace does it put the user in?”
Course topics range from design to accessibility to church. “Students met with members of an interfaith VR church,” says Vukov. “Afterwards, we asked them to reflect on the experience. Is VR good for building community? Is it better than real life? What opportunities or outlets does it provide for people who aren’t able to physically participate in society, such as those with health issues?”
Vukov says the course speaks to something distinctive about Loyola. “Our students and faculty go out and embrace the community,” even if it’s virtual. “We’re in dialogue with communities as we learn.”
Biology meets technology
In a class called “Evolution, Machine Learning, and Artificial Intelligence,” Burns uses ChatGPT as a framework to talk about biology. “The way these machine learning models are created and trained is lifted directly from evolutionary biology and neuroscience. That means we can use what I know as a biologist to help our students understand this complicated AI,” says Burns.
ChatGPT’s model favors human-like language. What it doesn’t yet favor is accuracy or critical thinking on nuanced topics, especially hot-button issues. “For well-documented, noncontentious topics, it will spit out truth,” Burns says. “But when you push the model further into the weeds, like conspiracy theories, it can get crazy. It will make things up just so it can provide you with an answer.”
The soul of the future
Vukov, for his part, leads his students in exploring the ethics of how ChatGPT is deployed and shows up in the world. “Is there bias in how it’s trained, resulting in bias in what is produced? Does it cheat or plagiarize? What is good versus poor use? What is it helping with? What is it taking from us? And what does authentic creativity look like?” In one course, he points to practical applications in areas like AI and robotics in elder care.
In addition to ethics, Vukov leans into his expertise in Catholic intellectual heritage to challenge his students. “If the Catholic tradition believes that humans have a bodily presence, what do AI and VR mean for human dignity? How does it impact or lead us to ask new questions about what it means to be human?”
Burns and Vukov will continue to help students answer questions like these. In 2023, the National Endowment for the Humanities awarded the professors a two-year grant of nearly $150,000 for “Humanizing STEM Education: Navigating Future Challenges Through Integrated Instruction,” a project to turn their work into a curriculum that prepares a new generation of students for what lies even further ahead.